Training Asymptotically Stable Recurrent Neural Networks

Authors

  • Nikitas J. Dimopoulos
  • John T. Dorocicz
  • Chris M. Jubien
  • Stephen W. Neville
Abstract

Abstract - In this work we present a class of recurrent networks which are asymptotically stable. For these networks, we discuss their similarity with certain structures in the central nervous system, and prove that if an interconnection pattern that does not allow excitatory feedback is used, then the resulting recurrent neural network is stable. We introduce a training methodology for networks belonging to this class, and use it to train networks that successfully identify a number of nonlinear systems.

…fashion. For example, the axons of the granule cells become elongated and are arranged in parallel to each other, forming the Parallel Fibers. Neurons from all the classes but granule cells receive input from the Parallel Fibers. The absence of loops (e.g. granule cells to granule cells) contributes to the stability of the structure, as we shall prove subsequently. Additionally, the arborizations of both the dendrites and the axons are limited, and thus neurons are affected by neurons which are in their immedi-
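The abstract's central claim, that forbidding excitatory feedback yields a stable recurrent network, can be illustrated numerically. The sketch below uses a generic additive recurrent model, not the authors' exact formulation: with purely inhibitory (non-positive) recurrent weights of modest magnitude, the Jacobian at the origin, -I + W, has all eigenvalues in the left half-plane, and trajectories decay to the equilibrium.

```python
import numpy as np

def simulate(W, x0, steps=5000, dt=0.01):
    """Euler-integrate the additive recurrent model dx/dt = -x + W @ tanh(x)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + W @ np.tanh(x))
    return x

# No excitatory feedback: every recurrent weight is <= 0 (mutual inhibition).
W_inh = np.array([[0.0, -0.8],
                  [-0.5, 0.0]])

x_final = simulate(W_inh, [2.0, -1.5])
print(np.linalg.norm(x_final))  # trajectory decays toward the origin
```

Here the spectral norm of W is below 1, so the leak term -x dominates and the origin is the unique, globally attracting equilibrium; adding a positive (excitatory) loop of sufficient gain would break this.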


Related articles

NLq Theory: Checking and Imposing Stability of Recurrent Neural Networks for Nonlinear Modelling

It is known that many discrete-time recurrent neural networks, such as neural state space models, multilayer Hopfield networks, and locally recurrent globally feedforward neural networks, can be represented as NLq systems. Sufficient conditions for global asymptotic stability and input/output stability of NLq systems are available, including three types of criteria: diagonal scaling and crit...

Full text

Stable Rough Extreme Learning Machines for the Identification of Uncertain Continuous-Time Nonlinear Systems

Rough extreme learning machines (RELMs) are rough-neural networks with one hidden layer, where the parameters between the inputs and hidden neurons are arbitrarily chosen and never updated. In this paper, we propose RELMs with a stable online learning algorithm for the identification of continuous-time nonlinear systems in the presence of noise and uncertainties, and we prove the global ...
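The RELM setup described above builds on the standard extreme-learning-machine recipe: input-to-hidden weights are drawn at random and frozen, and only the output weights are fit, in closed form by least squares. A minimal sketch of that recipe (plain ELM with NumPy, not the rough or stable-online variants from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy identification target: a smooth nonlinear map on [-1, 1].
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X).ravel()

# ELM step 1: random hidden-layer parameters, chosen once and never updated.
n_hidden = 50
W_in = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W_in + b)          # fixed random hidden features

# ELM step 2: only the output weights are trained, via least squares.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

mse = np.mean((H @ beta - y) ** 2)
```

Because the hidden layer is fixed, training reduces to a single linear solve, which is what makes stable online variants (updating beta recursively as samples arrive) tractable to analyze.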

Full text

Complex Valued Recurrent Neural Network: From Architecture to Training

Recurrent Neural Networks were invented a long time ago, and dozens of different architectures have been published. In this paper we generalize recurrent architectures to a state-space model, and we also generalize the numbers the network can process to the complex domain. We show how to train the recurrent network in the complex-valued case, and we present the theorems and procedures to make t...

Full text

Discrete Recurrent Neural Networks as Pushdown Automata

In this paper we describe a new discrete recurrent neural network model with discrete external stacks for learning context-free grammars (or pushdown automata). Conventional analog recurrent networks tend to have stability problems when presented with input strings which are longer than those used for training: the network's internal states become merged and the string cannot be correctly pars...

Full text

Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks

Modelling and forecasting the stock market is a challenging task for economists and engineers, since it has a dynamic structure and nonlinear characteristics. This nonlinearity affects the efficiency of the price characteristics. Using an Artificial Neural Network (ANN) is a proper way to model this nonlinearity, and it has been used successfully in one-step-ahead and multi-step-ahead prediction of di...

Full text


Journal:
  • Intelligent Automation & Soft Computing

Volume 2, Issue

Pages -

Published 1996